Closed-Form Bayesian Inferences for the Logit Model via Polynomial Expansions
Articles in the Marketing and choice literatures have demonstrated the need to
incorporate person-level heterogeneity into behavioral models (e.g., the logit
models for multiple binary outcomes studied here). However, the logit
likelihood extended with a population distribution of heterogeneity does not
yield closed-form inferences, so numerical integration techniques (e.g., MCMC
methods) are typically relied upon.
We present here an alternative: closed-form Bayesian inferences for the logit
model, obtained by approximating the logit likelihood via a polynomial
expansion and then positing a distribution of heterogeneity from a flexible
family that is conjugate and integrable under this expansion. When the response
coefficients are independent, choosing the Gamma distribution leads to rapidly
convergent closed-form expansions; when the coefficients are correlated,
equally rapid closed-form expansions follow from positing a Multivariate Gamma
distribution of heterogeneity. The solution then comes from the moment
generating function of the Multivariate Gamma distribution, or more generally
from whatever multivariate heterogeneity distribution is assumed.
Closed-form Bayesian inferences, derivatives (useful for elasticity
calculations), population distribution parameter estimates (useful for
summarization), and starting values (useful for complicated algorithms) are
hence directly available. Two simulation studies demonstrate the efficacy of
our approach.
Comment: 30 pages, 2 figures, corrected some typos. Appears in Quantitative
Marketing and Economics vol 4 (2006), no. 2, 173--20
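The mechanics behind this closed-form result can be illustrated in one dimension. Below is a minimal numerical sketch, not the paper's multivariate expansion: the logistic function is expanded as an alternating exponential series, each term is integrated against a Gamma(α, θ) heterogeneity distribution via its moment generating function, and the resulting series is checked against Monte Carlo integration. The parameter values α = 2, θ = 1, z = 1 are illustrative choices, not values from the paper.

```python
import math
import random

def sigmoid_gamma_expectation(alpha, theta, z, n_terms=2000):
    """Series for E[sigmoid(beta * z)] with beta ~ Gamma(alpha, theta), z > 0.

    Expand sigmoid(x) = sum_k (-1)^k e^{-k x}  (valid for x > 0), then
    integrate term by term using the Gamma MGF  M(t) = (1 - theta*t)^{-alpha},
    evaluated at t = -k z. Each term is then closed form.
    """
    return sum((-1) ** k * (1.0 + theta * k * z) ** (-alpha)
               for k in range(n_terms))

# Closed-form series vs. brute-force Monte Carlo integration
alpha, theta, z = 2.0, 1.0, 1.0
closed = sigmoid_gamma_expectation(alpha, theta, z)

random.seed(0)
n = 200_000
mc = sum(1.0 / (1.0 + math.exp(-random.gammavariate(alpha, theta) * z))
         for _ in range(n)) / n
```

For these illustrative values the series sums to the Dirichlet eta value η(2) = π²/12 ≈ 0.8225, which the Monte Carlo estimate reproduces to a few decimal places; the alternating terms decay like k^(-α), the "rapid convergence" referred to in the abstract.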
Choice Models in Marketing: Economic Assumptions, Challenges and Trends
Direct utility models of consumer choice are reviewed and developed for understanding consumer preferences. We begin with a review of statistical models of choice, posing a series of modeling challenges that are resolved by considering economic foundations based on constrained utility maximization. Direct utility models differ from other choice models by directly modeling the consumer utility function used to derive the likelihood of the data through Kuhn-Tucker conditions. Recent advances in Bayesian estimation make the estimation of these models computationally feasible, offering advantages in model interpretation over models based on indirect utility and over descriptive models that tend to be highly parameterized. Future trends are discussed in terms of the antecedents and enhancements of utility function specification.
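The Kuhn-Tucker conditions invoked above take a standard form. As a generic sketch (the notation here is ours, not any specific model's parameterization): a consumer chooses a non-negative bundle x to maximize utility subject to a budget constraint, and first-order conditions with a budget multiplier hold with complementary slackness.

```latex
\max_{x \ge 0} \; U(x) \quad \text{s.t.} \quad p^{\top} x \le E,
\qquad\text{with Kuhn--Tucker conditions}\qquad
\frac{\partial U}{\partial x_j} \le \lambda\, p_j,
\quad
x_j\!\left(\frac{\partial U}{\partial x_j} - \lambda\, p_j\right) = 0
\quad \forall j .
```

Interior goods (x_j > 0) equate marginal utility per dollar to the multiplier λ, while zero purchases satisfy the inequality strictly; it is this corner structure that lets direct utility models assign a likelihood to the zero-demand outcomes common in purchase data.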
Recommended from our members
A practitioner's guide to Bayesian estimation of discrete choice dynamic programming models
This paper provides a step-by-step guide to estimating infinite horizon discrete choice dynamic programming (DDP) models using a new Bayesian estimation algorithm (Imai et al., Econometrica 77:1865–1899, 2009a) (IJC). In the conventional nested fixed point algorithm, most of the information obtained in past iterations remains unused in the current iteration. In contrast, the IJC algorithm extensively uses the computational results obtained from past iterations to help solve the DDP model at the current iterated parameter values. Consequently, it has the potential to significantly alleviate the computational burden of estimating DDP models. To illustrate this new estimation method, we use a simple dynamic store choice model where stores offer “frequent-buyer” type rewards programs. Our Monte Carlo results demonstrate that the IJC method is able to recover the true parameter values of this model quite precisely. We also show that the IJC method could reduce the estimation time significantly when estimating DDP models with unobserved heterogeneity, especially when the discount factor is close to 1.
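The computational idea contrasted above — warm-starting each iteration's value function from a kernel-weighted average of past iterations' solutions, instead of re-solving the Bellman fixed point from scratch — can be sketched on a toy problem. The two-state, two-action model below is our own illustrative construction, not the store-choice model from the paper, and the kernel, bandwidth, and window choices are arbitrary.

```python
import random

BETA = 0.9           # discount factor
STATES = (0, 1)

def reward(s):
    return 1.0 if s == 1 else 0.0    # state 1 pays a per-period reward

def bellman(V, theta):
    """One Bellman step: 'stay' keeps the state, 'switch' moves to the
    other state at cost theta (the structural parameter)."""
    out = []
    for s in STATES:
        stay = reward(s) + BETA * V[s]
        switch = reward(1 - s) - theta + BETA * V[1 - s]
        out.append(max(stay, switch))
    return out

def solve_nfxp(theta, iters=500):
    """Conventional nested-fixed-point inner loop: iterate to convergence."""
    V = [0.0, 0.0]
    for _ in range(iters):
        V = bellman(V, theta)
    return V

def solve_ijc(theta_draws, bandwidth=0.05, window=20):
    """IJC-style loop: ONE Bellman step per parameter draw, warm-started
    from a kernel-weighted average of recent past (theta, V) pairs."""
    history = []
    for theta in theta_draws:
        recent = history[-window:]
        if recent:
            w = [max(1e-12, 1.0 - abs(theta - t) / bandwidth)
                 for t, _ in recent]
            tot = sum(w)
            V_guess = [sum(wi * Vi[s] for wi, (_, Vi) in zip(w, recent)) / tot
                       for s in STATES]
        else:
            V_guess = [0.0, 0.0]
        V_new = bellman(V_guess, theta)   # single update, not a full solve
        history.append((theta, V_new))
    return history

random.seed(1)
draws = [random.uniform(0.48, 0.52) for _ in range(1000)]  # e.g. MCMC draws
hist = solve_ijc(draws)
```

After enough draws, the stored solutions track the true value function: the nested-fixed-point benchmark at theta = 0.5 gives V = [9.5, 10.0], and the one-step-per-iteration IJC history lands close to it while doing a small fraction of the Bellman work.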
Interdependent Infrastructure as Linked Social, Ecological, and Technological Systems (SETSs) to Address Lock‐in and Enhance Resilience
Traditional infrastructure adaptation to extreme weather events (and now climate change) has typically been techno‐centric and heavily grounded in robustness—the capacity to prevent or minimize disruptions via a risk‐based approach that emphasizes control, armoring, and strengthening (e.g., raising the height of levees). However, climate and nonclimate challenges facing infrastructure are not purely technological. Ecological and social systems also warrant consideration to manage issues of overconfidence, inflexibility, interdependence, and resource utilization—among others. As a result, techno‐centric adaptation strategies can result in unwanted tradeoffs, unintended consequences, and underaddressed vulnerabilities. Techno‐centric strategies that lock‐in today's infrastructure systems to vulnerable future design, management, and regulatory practices may be particularly problematic by exacerbating these ecological and social issues rather than ameliorating them. Given these challenges, we develop a conceptual model and infrastructure adaptation case studies to argue the following: (1) infrastructure systems are not simply technological and should be understood as complex and interconnected social, ecological, and technological systems (SETSs); (2) infrastructure challenges, like lock‐in, stem from SETS interactions that are often overlooked and underappreciated; (3) framing infrastructure with a SETS lens can help identify and prevent maladaptive issues like lock‐in; and (4) a SETS lens can also highlight effective infrastructure adaptation strategies that may not traditionally be considered. Ultimately, we find that treating infrastructure as SETS shows promise for increasing the adaptive capacity of infrastructure systems by highlighting how lock‐in and vulnerabilities evolve and how multidisciplinary strategies can be deployed to address these challenges by broadening the options for adaptation.
A genomic biomarker signature can predict skin sensitizers using a cell-based in vitro alternative to animal tests
Background: Allergic contact dermatitis is an inflammatory skin disease that affects a significant proportion of the population. This disease is caused by an adverse immune response towards chemical haptens and leads to a substantial economic burden for society. Current tests of sensitizing chemicals rely on animal experimentation. New legislation on the registration and use of chemicals within the pharmaceutical and cosmetic industries has stimulated significant research efforts to develop alternative, human cell-based assays for the prediction of sensitization. The aim is to replace animal experiments with in vitro tests displaying a higher predictive power.
Results: We have developed a novel cell-based assay for the prediction of sensitizing chemicals. By analyzing the transcriptome of the human cell line MUTZ-3 after 24 h of stimulation, using 20 different sensitizing chemicals, 20 non-sensitizing chemicals, and vehicle controls, we have identified a biomarker signature of 200 genes with potent discriminatory ability. Using a Support Vector Machine for supervised classification, the prediction performance of the assay revealed an area under the ROC curve of 0.98. In addition, when the chemicals were categorized according to the LLNA assay, this gene signature could also predict sensitizing potency. The identified markers are involved in biological pathways with immunologically relevant functions, which can shed light on the process of human sensitization.
Conclusions: A gene signature predicting sensitization, using a human cell line in vitro, has been identified. This simple and robust cell-based assay has the potential to completely replace or drastically reduce the utilization of test systems based on experimental animals. Being based on human biology, the assay is proposed to be more accurate for predicting sensitization in humans than traditional animal-based tests.
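The supervised-classification step described above (an SVM scored by area under the ROC curve) can be sketched in plain Python. The sketch below substitutes synthetic Gaussian "expression profiles" for the MUTZ-3 transcriptome data and a hand-rolled Pegasos-style linear SVM for whatever implementation the authors used; the AUC is computed with the rank-sum (Mann-Whitney) formula. All names, dimensions, and parameter values are illustrative.

```python
import random

random.seed(42)
DIM = 10  # stand-in for a (much larger) gene signature

def make_samples(n, label):
    # synthetic stand-in for expression profiles: class means at +/- 0.5
    mu = 0.5 if label == 1 else -0.5
    return [([random.gauss(mu, 1.0) for _ in range(DIM)], label)
            for _ in range(n)]

train = make_samples(100, 1) + make_samples(100, -1)   # sensitizer vs. not
test = make_samples(100, 1) + make_samples(100, -1)

def train_linear_svm(data, lam=0.01, epochs=20):
    """Pegasos-style sub-gradient training of a linear SVM
    (hinge loss + L2 regularization, decreasing step size 1/(lam*t))."""
    w = [0.0] * DIM
    t = 0
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            decay = 1.0 - eta * lam
            if margin < 1.0:   # inside the margin: move toward y*x
                w = [decay * wi + eta * y * xi for wi, xi in zip(w, x)]
            else:              # correctly classified: shrink only
                w = [decay * wi for wi in w]
    return w

def roc_auc(scores, labels):
    """AUC via the rank-sum formula: P(score_pos > score_neg)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    rank_sum = sum(r + 1 for r, i in enumerate(order) if labels[i] == 1)
    n_pos = sum(1 for y in labels if y == 1)
    n_neg = len(labels) - n_pos
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

w = train_linear_svm(list(train))
scores = [sum(wi * xi for wi, xi in zip(w, x)) for x, _ in test]
labels = [y for _, y in test]
auc = roc_auc(scores, labels)
```

On this well-separated synthetic data the held-out AUC comes out high (well above 0.85), mirroring how the assay's 0.98 figure would be computed from decision scores on held-out chemicals.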